Even artificial intelligence (AI) can be racist and colonialist, some experts say — an issue currently under investigation by academics.

For example, algorithms can generate harmful associations about Black people, says professor Karine Gentelet of the Université du Québec en Outaouais.

"It's absolutely terrible, it's colonization and racism," she said in an interview with The Canadian Press.

Gentelet, a sociologist and anthropologist specializing in these issues, is taking a critical look at the progress of AI. On Tuesday, she'll take part in a conference at Laval University focusing on the decolonization of AI, ethics and the rule of law.

AI refers to technologies that allow applications or computers to imitate human intelligence — think algorithms, facial recognition technologies, and autonomous vehicles.

But Gentelet says this technology is far from being "neutral."

AI perpetuates the colonial domination of the West over Indigenous peoples and over countries in Africa, Asia and Oceania, she explained.

"There are power relationships because these are technologies that are heavily funded and often developed in [countries of] the North and then implemented in the South."

In some AI tools, there's a "representation of what the human person is and how he or she interacts in society" that does not necessarily correspond to people outside of the majority group, she continued.

"The representation we have of racialized people in northern societies is not adequate to their contribution in society."

How does this translate into reality? Take the example of health databases, which are used to design new drugs or document health problems.

Some marginalized groups don't go to doctors and therefore don't show up in databases, Gentelet explained.

"There is some degree of pre-existing inequality already in the data, which does not reflect the composition of the population," she continued.

She said communities that are struggling to break through "social invisibility" are still being excluded because the data does not reflect them.

Another example is software that Immigration Canada has tested on refugee claimants and people in the immigration process.

"The accountability process is much more difficult for them because if they want to complain, where do they go? To Canada, where they are not citizens?"

This field is still relatively new and controversial — there's no unanimity, but there is a growing body of research, Gentelet said.

"Decolonization, in general, is something that is controversial in Canada. There are people who have opposing visions."

She said several solutions are possible, from stricter regulatory oversight to more accountability and investment.

But this controversial issue must be addressed quickly, she stressed. While artificial intelligence is viewed favorably because it solves problems, "there are underlying conditions for solving those problems that will not be fair."

In Gentelet's opinion, "it participates in a recolonization, because technology is a reflection of society." 

-- This report was first published in French by The Canadian Press on April 9, 2022.